
    The Niceness of Unique Sink Orientations

    Random Edge is the most natural randomized pivot rule for the simplex algorithm. Considerable progress has been made recently towards fully understanding its behavior. Back in 2001, Welzl introduced the concepts of \emph{reachmaps} and \emph{niceness} of Unique Sink Orientations (USO), in an effort to better understand the behavior of Random Edge. In this paper, we initiate the systematic study of these concepts. We settle the questions that were asked by Welzl about the niceness of (acyclic) USO. Niceness implies natural upper bounds for Random Edge, and we provide evidence that these are tight or almost tight in many interesting cases. Moreover, we show that Random Edge is polynomial on at least n^{\Omega(2^n)} many (possibly cyclic) USO. As a bonus, we describe a derandomization of Random Edge which achieves the same asymptotic upper bounds with respect to niceness, and discuss some algorithmic properties of the reachmap. Comment: An extended abstract appears in the proceedings of Approx/Random 2017
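
    To make the pivot rule concrete, here is a minimal sketch of Random Edge on a hypercube USO. This is an illustration only, not the paper's construction: the representation of a USO as a function returning outgoing coordinates, and the toy orientation toward_zero, are assumptions made for the example.

```python
import random

def random_edge(outgoing, start, rng=random.Random(0)):
    """Random Edge pivot rule: from the current vertex, cross a uniformly
    random outgoing edge until reaching the vertex with no outgoing edges
    (the unique sink of the USO)."""
    v, steps = start, 0
    while True:
        out = outgoing(v)
        if not out:                  # sink reached
            return v, steps
        i = rng.choice(out)          # pick an outgoing coordinate at random
        v = v[:i] + (1 - v[i],) + v[i + 1:]  # traverse the cube edge in direction i
        steps += 1

# Toy acyclic USO on the n-cube: every edge is oriented toward the
# all-zeros vertex, so the outgoing coordinates of v are exactly its 1-bits.
def toward_zero(v):
    return [i for i, bit in enumerate(v) if bit == 1]

n = 6
sink, steps = random_edge(toward_zero, start=(1,) * n)
print(sink, steps)  # (0, 0, 0, 0, 0, 0) reached after n = 6 steps
```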

    A subexponential algorithm for abstract optimization problems

    An Abstract Optimization Problem (AOP) is a triple (H, <, \Phi), where H is a finite set, < a total order on 2^H, and \Phi an oracle that, for given F \subseteq G \subseteq H, either reports that F = \min_< \{F' \mid F' \subseteq G\} or returns a set F' \subseteq G with F' < F. To solve the problem means to find the minimum set in H. We present a randomized algorithm that solves any AOP with an expected number of at most e^{2\sqrt{n} + O(\sqrt[4]{n} \ln n)} oracle calls, n = |H|. In contrast, any deterministic algorithm needs to make 2^n - 1 oracle calls in the worst case. The algorithm is applied to the problem of finding the minimum distance between two n-vertex (or n-facet) polyhedra in d-space, and to the computation of the smallest ball containing n points in d-space; for both problems we give the first subexponential bounds in d.
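
    The oracle model can be illustrated with a small sketch. This is a brute-force improving loop, not the paper's subexponential randomized algorithm; the weight key used to define the total order on subsets is a hypothetical stand-in chosen for the example.

```python
from itertools import combinations

# Hypothetical total order '<' on subsets: compare by sum of elements,
# breaking ties lexicographically on the sorted element tuple.
def weight(S):
    return (sum(S), tuple(sorted(S)))

def make_oracle(weight):
    """AOP oracle: given F ⊆ G ⊆ H, either certify that F is the
    <-minimum subset of G or return some strictly <-smaller F' ⊆ G."""
    def phi(F, G):
        for r in range(len(G) + 1):
            for s in combinations(sorted(G), r):
                s = frozenset(s)
                if weight(s) < weight(F):
                    return False, s      # improving set F' found
        return True, F                   # F is the minimum subset of G
    return phi

def naive_solve(H, phi):
    """Repeat improving oracle calls until minimality is certified.
    Brute force only: the paper's algorithm needs e^{O(sqrt n)} expected
    oracle calls, versus 2^n - 1 for any deterministic method."""
    F = frozenset(H)
    while True:
        done, F = phi(F, frozenset(H))
        if done:
            return F

H = {1, 2, 3}
print(naive_solve(H, make_oracle(weight)))  # frozenset(): the empty set is minimal
```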

    An Exponential Lower Bound on the Complexity of Regularization Paths

    For a variety of regularized optimization problems in machine learning, algorithms computing the entire solution path have been developed recently. Most of these methods are quadratic programs that are parameterized by a single parameter, as for example the Support Vector Machine (SVM). Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e. linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of n input points in d dimensions for an SVM such that at least \Theta(2^{n/2}) = \Theta(2^d) many distinct subsets of support vectors occur as the regularization parameter changes. Comment: Journal version, 28 Pages, 5 Figures
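
    The quantity being bounded, the number of distinct support-vector subsets along the path, can be probed empirically. The sketch below is an illustration under stated assumptions: it uses scikit-learn and a coarse grid of regularization values rather than an exact path-following algorithm, so it only lower-bounds the count on the chosen grid.

```python
import numpy as np
from sklearn.svm import SVC

# Crude empirical probe of path complexity: refit a linear SVM over a
# grid of regularization values C and count how many distinct
# support-vector subsets appear along the way.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy linearly separable labels

subsets = []
for C in np.logspace(-3, 3, 200):
    sv = frozenset(SVC(kernel="linear", C=C).fit(X, y).support_)
    if not subsets or subsets[-1] != sv:  # record each change of subset
        subsets.append(sv)
print(f"{len(subsets)} distinct support-vector subsets on this grid")
```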